Navigating educational institutions can be complex, especially for newcomers. This paper proposes an Artificial Intelligence-based navigation system employing an autonomous robot to deliver real-time, efficient, and user-centric guidance. Using LiDAR and artificial intelligence, the robot detects obstacles and maps its environment to provide continuous path directions. The system includes a graphical user interface built with Streamlit, supports both classical and advanced pathfinding algorithms, and applies supervised learning to a pre-trained dataset of more than 600 A* paths. Designed to operate within school campuses, it improves accessibility and operational efficiency and lays the foundation for larger-scale applications in public spaces.
Introduction
Contemporary college campuses are busy, complex environments where navigation can be challenging, especially for new students and visitors during peak periods such as admissions or conferences. To improve the campus experience, an AI-powered autonomous robot navigation system is proposed. This system uses advanced pathfinding algorithms and real-time environmental sensing to guide users seamlessly via simple web or mobile interfaces. It adapts dynamically to obstacles, construction, or congestion, while also serving as a centralized digital hub for campus information and accessibility features.
The system aligns with the vision of a "smart campus" that integrates digital technologies for improved efficiency, security, and user satisfaction. It reduces human workload by automating navigation and informational services but requires attention to data privacy and infrastructure scalability for long-term success.
The literature survey reviews various studies on indoor navigation, autonomous robots, LiDAR sensing, and AI algorithms, highlighting advancements and gaps in current technologies. These include multi-floor indoor robots, GPS and sensor fusion techniques, semantic mapping, mobile apps for campus navigation, and obstacle detection methods. Most prior works focus on isolated aspects and lack comprehensive AI integration for real-time autonomous campus navigation.
The methodology section outlines a hybrid software and hardware system combining image processing, graph theory, classical pathfinding algorithms (A*, Dijkstra, BFS, DFS), and AI-based predictive modeling to optimize routing in mapped environments. Data from thousands of computed paths trains machine learning models to enhance future navigation decisions, visualized in real time through a user interface.
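As an illustration of the classical planners listed above, a minimal A* search over a 4-connected occupancy grid can be sketched as follows. The grid layout and cell coordinates are illustrative assumptions, not the system's actual campus map:

```python
import heapq

def a_star(grid, start, goal):
    """A* search on a 4-connected occupancy grid (0 = free, 1 = obstacle).

    Returns the list of cells from start to goal, or None if unreachable.
    """
    def h(a, b):
        # Manhattan distance: admissible for unit-cost 4-connected grids
        return abs(a[0] - b[0]) + abs(a[1] - b[1])

    rows, cols = len(grid), len(grid[0])
    open_heap = [(h(start, goal), 0, start)]  # (f = g + h, g, node)
    came_from = {}
    g = {start: 0}
    while open_heap:
        _, cost, node = heapq.heappop(open_heap)
        if node == goal:
            # Reconstruct the route by walking the parent links backwards
            path = [node]
            while node in came_from:
                node = came_from[node]
                path.append(node)
            return path[::-1]
        if cost > g.get(node, float("inf")):
            continue  # stale heap entry superseded by a cheaper one
        r, c = node
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                ng = cost + 1
                if ng < g.get((nr, nc), float("inf")):
                    g[(nr, nc)] = ng
                    came_from[(nr, nc)] = node
                    heapq.heappush(open_heap, (ng + h((nr, nc), goal), ng, (nr, nc)))
    return None

# Toy map: 0 = walkable corridor cell, 1 = obstacle
grid = [
    [0, 0, 0, 0],
    [1, 1, 0, 1],
    [0, 0, 0, 0],
    [0, 1, 1, 0],
]
path = a_star(grid, (0, 0), (3, 3))  # optimal route: 7 cells including endpoints
```

Dijkstra's algorithm is the special case where the heuristic returns zero, and BFS corresponds to uniform edge costs with a plain queue, so the same skeleton covers the other classical planners mentioned.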
Overall, the research showcases how AI and autonomous robotics can revolutionize campus navigation by creating intelligent, adaptive, and user-friendly systems that meet diverse campus needs and improve accessibility, operational efficiency, and user satisfaction.
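The supervised-learning step described in the methodology, in which previously computed A* paths train a model that predicts routes directly, can be sketched as a simple next-hop learner. The campus node names and the toy training corpus below are illustrative assumptions, not the actual 600-path dataset:

```python
from collections import Counter, defaultdict

def train_next_hop(paths):
    """Learn a next-hop table from example planner outputs.

    Each training path is a node sequence; for every (current, goal)
    state observed, record which neighbour the planner moved to next
    and keep the majority choice.
    """
    votes = defaultdict(Counter)
    for path in paths:
        goal = path[-1]
        for cur, nxt in zip(path, path[1:]):
            votes[(cur, goal)][nxt] += 1
    return {key: c.most_common(1)[0][0] for key, c in votes.items()}

def predict_path(model, start, goal, max_steps=50):
    """Follow learned next-hops from start toward goal."""
    node, path = start, [start]
    for _ in range(max_steps):
        if node == goal:
            return path
        nxt = model.get((node, goal))
        if nxt is None:
            return None  # state never seen during training
        node = nxt
        path.append(node)
    return None  # gave up: loop or route longer than max_steps

# Hypothetical corpus of A*-style paths between campus landmarks
training = [
    ["Gate", "Library", "Lab"],
    ["Gate", "Library", "Lab"],
    ["Hostel", "Library", "Lab"],
]
model = train_next_hop(training)
route = predict_path(model, "Gate", "Lab")  # → ["Gate", "Library", "Lab"]
```

The real system would use a richer feature representation than raw node identity, but the principle is the same: repeated planner runs become training data, so frequent routes can be answered without re-running the search.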
Conclusion
Artificial Intelligence-based navigation systems using autonomous robots are becoming valuable tools for improving navigation, accessibility, and the overall user experience on university campuses. These robots use advanced technologies such as LiDAR and Inertial Measurement Units (IMUs) to provide efficient, real-time guidance, which is essential for navigating complex environments. One of their main roles is to assist visitors, students, faculty, and guests by offering reliable navigation support. Additionally, they can streamline campus operations by handling routine navigation-related tasks. With ongoing advancements in robotics and artificial intelligence, the future of such systems looks promising, indicating potential improvements in educational environments and other public spaces.
The proposed AI-driven navigation system offers an efficient solution to campus navigation problems, and its integration of hardware and cognitive software enables real-time autonomous navigation. Areas of future work include multilingual speech guidance, a real-time rerouting feature, extension to spaces such as airports, hospitals, and museums, and scalability through cloud-based path training.
References
[1] Anca Morar, Alin Moldoveanu, Irina Mocanu, Florica Moldoveanu, Ion Emilian Radoi, Victor Asavei, Alexandru Gradinaru and Alex Butean, A Comprehensive Survey of Indoor Localization Methods, National Library of Medicine, Sensors (Basel), Vol. 20, pp. 1-36, 2020.
[2] Bimal Paneru, Niraj Basnet, Sagar Shrestha, Rabin Giri and Dinesh Baniya Kshatri, Autonomous Navigation of a Mobile Robot in Indoor Environments, Zerone Scholar, Vol. 1, No. 1, pp. 3-8, Nov 2016.
[3] Cristian Molder, Daniel Toma and Andrei Țigău, Navigation Algorithms with LIDAR for Mobile Robots, Journal of Military Technology, Vol. 2, No. 1, pp. 5-10, 2019.
[4] Daegyu Lee, Gyuree Kang, Boseong Kim and D. Hyunchul Shim, Assistive Delivery Robot Application for Real-World Postal Services, IEEE Access, Vol. 9, pp. 141981-141998, Oct 2021.
[5] Minghao Liu, Zhixing Hou, Zezhou Sun, Ning Yin, Hang Yang, Ying Wang, Zhiqiang Chu and Hui Kong, Campus Guide: A Lidar-based Mobile Robot, European Conference on Mobile Robots (ECMR), pp. 1-6, 2019.
[6] Nicky Zimmerman, Matteo Sodano, Elias Marks, Jens Behley and Cyrill Stachniss, Constructing Metric-Semantic Maps Using Floor Plan Priors for Long-Term Indoor Localization, IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), pp. 1366-1372, 2023.
[7] Rabab Alayham Abbas Helmi, Harini A.P. Ravichandran, Arshad Jamal and M. N. Mohammed, Design and Development of Indoor Campus Navigation Application, IEEE Conference on Systems, Process & Control (ICSPC), pp. 77-82, 2022.
[8] Sheryl Sharon G, Rohit Vikaas P, Chanduru A, Barathkumar S, Harsha Vardhan P and M. Mohanapriya, Coimbatore Institute of Technology Campus Navigation System Version 1.0, International Journal for Research in Applied Science & Engineering Technology (IJRASET), Vol. 11, pp. 1121-1127, Oct 2023.
[9] Sithara Jeyaraj and Jincy Jose, Indoor Navigation for Shopping Robot, International Journal for Research in Engineering Application & Management (IJREAM), pp. 316-322, July 2018.
[10] Susovan Jana and Matangini Chattopadhyay, An Event-Driven University Campus Navigation System on Android Platform, Applications and Innovations in Mobile Computing (AIMoC), pp. 182-187, 2015.
[11] Taejin Kim, Daegyu Lee, Gyuree Kang and D. Hyunchul Shim, Development of an Indoor Delivery Mobile Robot for a Multi-Floor Environment, IEEE Access, Vol. 12, pp. 45202-45215, March 2024.
[12] He Zhao and Zheyao Wang, Motion Measurement Using Inertial Sensors, Ultrasonic Sensors and Magnetometers with Extended Kalman Filter for Data Fusion, IEEE Sensors Journal, Vol. 12, No. 5, May 2012.
[13] Chen Qiu and Matt W. Mutka, iFrame: Dynamic Indoor Map Construction through Automatic Mobile Sensing, Pervasive and Mobile Computing, 2017. DOI: 10.1016/j.pmcj.2016.12.008.